Quake study offers new clues to a California fault’s mystery

LOS ANGELES — Thanks to a new method of modeling earthquakes, scientists may now understand why the Parkfield segment of the San Andreas fault — a carefully studied region known for producing moderate temblors every 20 years or so — has been behaving unexpectedly since around the time Ronald Reagan was in the White House.

Combining data collected by sensors on the ground and in space with observations from laboratory physics experiments, Caltech researchers ran a computer simulation of tectonic events at Parkfield and discovered that a series of small quakes there may have staved off a larger shaker that geologists predicted would occur in the late 1980s or early 1990s. Instead, the fault produced a magnitude-6.0 quake in 2004, more than a decade behind schedule.

Someday, exercises like this could help scientists predict the worst-case scenario for different spots along a fault line, said Nadia Lapusta, co-author of the study, which was published Thursday in the journal Science.

“You won’t predict when the earthquake will happen, but you will be able to predict what is possible and plan for it,” she said.

The Parkfield segment, which is in Central California at the boundary of the Pacific and North American plates, may be the most closely scrutinized 15-mile length of fault in the world.

The region is known to have produced magnitude 6 earthquakes every 20 years, on average, since at least 1857.

“The short recurrence time gives every generation of scientists an opportunity to make predictions and test observations,” said Sylvain Barbot, a postdoctoral researcher at Caltech who led the study.

The town of Parkfield lures tourists with the motto “Be here when it happens.” Scientists flock to the town, about 20 miles northeast of Paso Robles, to participate in the U.S. Geological Survey’s Parkfield Experiment, which employs instruments that track seismic activity, plate motion, groundwater levels, magnetic field fluctuations and other details to help scientists understand what’s happening underground.

Some of the tools are buried more than a mile below ground. Geologists also measure ground movement using GPS satellites in space.

After a magnitude 6.0 quake rocked the segment in 1966, geologists expected another shaker within about 20 years. They were surprised when nothing significant happened until the 6.0 temblor hit in 2004.

They were even more puzzled, Barbot said, by another characteristic of the 2004 quake: Instead of propagating from the northwestern end of the segment, it started at the southeastern end.

The computer program the Caltech team developed took measurements from Parkfield and plugged them into a model that incorporated laws of friction that had been observed in lab experiments, in which scientists rubbed rocks together to simulate the complicated physical forces underground.
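
The article does not name the friction laws the model used; the lab-derived formulation most widely used in earthquake-cycle simulations of this kind is rate-and-state friction (the Dieterich-Ruina law), in which friction depends on how fast the fault is slipping and on a "state" variable that tracks the history of contact. A minimal sketch of that law, with typical lab-scale parameter values that are illustrative rather than taken from the study:

```python
import numpy as np

# Illustrative rate-and-state friction (Dieterich-Ruina form), the
# lab-derived friction law most often used in earthquake-cycle models.
# All parameter values are typical lab-scale numbers, not the study's.
MU0 = 0.6     # reference friction coefficient
A = 0.010     # "direct effect" sensitivity to slip velocity
B = 0.015     # state-evolution sensitivity; A - B < 0 permits stick-slip
V0 = 1e-6     # reference slip velocity, m/s
DC = 1e-4     # characteristic slip distance, m

def friction(v, theta):
    """Friction coefficient for slip velocity v (m/s) and state theta (s)."""
    return MU0 + A * np.log(v / V0) + B * np.log(V0 * theta / DC)

def theta_rate(v, theta):
    """Aging-law state evolution: the fault 'heals' (strengthens) at rest
    and weakens as slip accumulates."""
    return 1.0 - v * theta / DC

# Sanity check: steady sliding at v = V0 gives theta = DC / V0 and
# friction exactly MU0.
print(friction(V0, DC / V0))  # -> 0.6
```

Because B exceeds A here, the fault weakens as slip speeds up, which is what lets a modeled fault lock, fail suddenly, and re-lock over many cycles.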

To help the researchers plow through the calculations, Lapusta tuned the model to use adaptive time steps, slowing down during earthquakes (computing data points once every 7 or 8 milliseconds) and speeding up when the fault was quiet (spacing computations as much as two days apart).
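
In programming terms, that tuning is adaptive time-stepping: the solver shrinks its step to milliseconds while the fault is slipping fast and stretches it toward days while the fault is locked. A hypothetical sketch of such a step-control rule, with the clamping bounds taken from the figures quoted above and everything else illustrative:

```python
MS = 1e-3       # one millisecond, in seconds
DAY = 86400.0   # one day, in seconds

def choose_dt(slip_velocity, dc=1e-4, safety=0.1):
    """Hypothetical step-control rule: limit each step so the fault slips
    only a small fraction of the characteristic distance dc, then clamp
    between ~7 ms (during quakes) and 2 days (quiet periods)."""
    dt = safety * dc / max(slip_velocity, 1e-20)
    return min(max(dt, 7 * MS), 2 * DAY)

print(choose_dt(1.0))    # fast coseismic slip     -> 0.007 s
print(choose_dt(1e-12))  # slow interseismic creep -> 172800.0 s (2 days)
```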

When the scientists ran their program, the simulations reproduced the time delay and change in earthquake origin observed by geologists at Parkfield, suggesting that the model correctly described the physics of the segment.

Barbot said the team believed that smaller temblors at Parkfield in the 1980s and 1990s, around magnitude 4.0, redistributed stresses on the fault and delayed a larger quake at the northwestern end, giving the southeastern end its opportunity to take over.

“These two places are both good to start an earthquake,” he said. “It’s sort of a competition, and once one goes the other is out of the race.”

The model also predicted that the next quakes at Parkfield would again originate at the southeastern end, said Andy Michael, a geophysicist with the U.S. Geological Survey in Menlo Park, Calif., who was not involved in the research.

“That would be really fascinating,” he said. “Of course, to see if that happens will take another 60 years.”

The big limiting factor on running these kinds of models isn’t ideas, but computational time, Barbot said.

It took 64 powerful computers a full week to crunch through the data. But the program was able to simulate 200 years of geology at Parkfield. Earlier simulations focused narrowly on short-term events or depended on conceptual descriptions instead of real data inputs to describe longer geologic cycles, Barbot said.

“Our model is the only one I know of that can simulate many cycles of earthquakes — the phases where the fault is locked, then isn’t, then is — on this scale,” he said.

But the program has significant limitations, he added.

For example, it disregards “an elephant in the room” — how interactions with nearby faults affect the physics at Parkfield.

Michael of the USGS said he thought some of the geological assumptions underlying the model probably weren’t correct.

But he also said the research marked a “nice step” toward creating physics-based computer models that reproduce observations on the ground and promise to help scientists make predictions about quakes. They could also make forecasts about how human activity affects seismic events, the researchers wrote in Science.

Barbot said he would also like to see someone study interactions among the Elsinore, San Jacinto and San Andreas faults to assess the possible behavior of a magnitude-7.0-or-larger quake in Southern California. Lapusta is already working on an analysis of the magnitude-9.0 Tohoku quake that devastated Japan in 2011.

———

(c)2012 the Los Angeles Times

Visit the Los Angeles Times at www.latimes.com

Distributed by MCT Information Services
